back-propagation - translation into Russian

OPTIMIZATION ALGORITHM FOR ARTIFICIAL NEURAL NETWORKS
Also listed as: Back-propagation; Back propagation; Backprop; Backpropogation; Error back-propagation; BP algorithm; Back prop; Back-Propagation
[Figure captions: a simple neural network with two input units (each with a single input) and one output unit (with two inputs); the error surface of a linear neuron for a single training case; gradient descent may find a local minimum instead of the global minimum.]

back-propagation         

general vocabulary

обратное распространение (обратная передача) ошибки обучения [нейронной сети], обратная связь (при [само]обучении нейронной сети) [i.e., backward propagation (backward transfer) of the training error of a neural network; feedback during the (self-)training of a neural network]

synonym

backpropagation (bp)

See also

generalized delta rule

back prop         

construction

подкос траншейных креплений [i.e., a raking strut used in trench shoring]

backpropagation         

general vocabulary

bp

See also

back-propagation

Definition

back-propagation
(Or "backpropagation") A learning algorithm for modifying a feed-forward neural network which minimises a continuous "error function" or "objective function." Back-propagation is a "gradient descent" method of training in that it uses gradient information to modify the network weights to decrease the value of the error function on subsequent tests of the inputs. Other gradient-based methods from numerical analysis can be used to train networks more efficiently. Back-propagation makes use of a mathematical trick when the network is simulated on a digital computer, yielding in just two traversals of the network (once forward, and once back) both the difference between the desired and actual output, and the derivatives of this difference with respect to the connection weights.

Wikipedia

Backpropagation

In machine learning, backpropagation is a widely used algorithm for training feedforward artificial neural networks and other parameterized networks with differentiable nodes. It is an efficient application of the Leibniz chain rule (1673) to such networks, and is also known as the reverse mode of automatic differentiation or reverse accumulation, due to Seppo Linnainmaa (1970). The term "back-propagating error correction" was introduced in 1962 by Frank Rosenblatt, though he did not know how to implement it; Henry J. Kelley had already described a continuous precursor of backpropagation in 1960 in the context of control theory.

Backpropagation computes the gradient of a loss function with respect to the weights of the network for a single input–output example, and does so efficiently: the gradient is computed one layer at a time, iterating backward from the last layer, which avoids redundant evaluation of intermediate terms in the chain rule and can be derived as a dynamic-programming scheme. The resulting gradient is then used by gradient descent or a variant such as stochastic gradient descent.
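
As a sketch of that layer-by-layer recursion in standard notation (the symbols here are assumptions for illustration, not taken from the entry): with pre-activations $z^{(l)} = W^{(l)} a^{(l-1)} + b^{(l)}$, activations $a^{(l)} = \sigma(z^{(l)})$, and cost $C$, the backward pass computes an error signal $\delta^{(l)}$ per layer:

$$\delta^{(L)} = \nabla_{a^{(L)}} C \odot \sigma'\!\left(z^{(L)}\right), \qquad \delta^{(l)} = \left( (W^{(l+1)})^{\top} \delta^{(l+1)} \right) \odot \sigma'\!\left(z^{(l)}\right), \qquad \frac{\partial C}{\partial W^{(l)}} = \delta^{(l)} \left(a^{(l-1)}\right)^{\top}.$$

Each $\delta^{(l)}$ is computed once and reused by the layer below it, which is the dynamic-programming step that avoids recomputing chain-rule terms.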

The term backpropagation strictly refers only to the algorithm for computing the gradient, not how the gradient is used; however, the term is often used loosely to refer to the entire learning algorithm, including how the gradient is used, such as by stochastic gradient descent. In 1985, David E. Rumelhart et al. published an experimental analysis of the technique. This contributed to the popularization of backpropagation and helped to initiate an active period of research in multilayer perceptrons.
